Hybrid Adaptive Evolutionary Algorithm Hyper-Heuristic
Author

Abstract
This paper presents a hyper-heuristic that adapts two low-level parameters (depth of search rate and intensity of mutation) and the probability of applying a low-level heuristic to an evolving solution in order to improve the solution quality. Basically, two random subsets of heuristics are maintained: one sub-set of the full set of heuristics and another sub-set of the local heuristics. For the full set of heuristics, the random selection is performed in such a way that at least one local heuristic is included in the sub-set. In each iteration, one heuristic is selected from the sub-set of heuristics according to its associated probability (just the heuristics in the sub-set are considered, and their initial probability is the same for all of them). The same process is performed for selecting a local heuristic. Then, the heuristic and the local heuristic that were selected are applied to the candidate solution. The generated solution is compared against the candidate solution. If the solution is improved, the candidate solution is replaced by the new one and both the heuristic and the local heuristic are rewarded (an increase of the probability of being selected is performed in a random fashion). If no improvement is made, both the heuristic and the local heuristic are punished (a decrease of the probability of being selected is performed in a random fashion), the low-level heuristic parameters are adjusted, and a soft replacement policy is applied. The low-level parameter depth of search rate (starting at 0.1) is increased by a random value in the range [0.0, 0.1] up to a maximum value of 0.5. When the depth of search rate reaches 0.5, the parameter is reset to a value in [0.0, 0.1]. The intensity of mutation is set as the complementary value of the depth of search rate. The soft replacement policy includes a control variable that determines whether the full range of depth of search rates has been tested or not.
If so, a new subset of heuristics and a new subset of local heuristics are selected, and a new candidate solution is created by a randomly selected crossover heuristic (applied between the best solution reached and the generated solution) followed by a local heuristic. Otherwise, the candidate solution is kept only if the new solution has lower performance.

1 Algorithm description

The Hybrid Adaptive Evolutionary Hyper-Heuristic (HaEaHH) proposed in this paper is based on the Hybrid Adaptive Evolutionary Algorithm (HaEa) proposed by Gomez in [1]. Since HaEa is an individual-based approach, we use only three individuals in the population for evolving a solution: the current candidate solution (parent), the generated solution (child), and the best solution reached during the evolution (best); see Algorithm 1. Basically, the algorithm is divided into four main steps: setting initial variable values (line 1), setting the low-level heuristic parameters (lines 3 and 4), offspring generation (line 5), and the replacement strategy (line 6). Notice that the depth of search parameter is always set in the opposite manner to the intensity of mutation parameter: when depth of search is high, intensity of mutation is low.

1.1 Initializing Variable Values

The initialization process is shown in Algorithm 2. We set the memory size to 3 (line 1), since we use only three individuals in the population for evolving a solution: the current candidate solution (parent), the generated solution (child), and the best solution reached during the evolution (best). We initialize two of them and select the better one as the parent and initial best solution (lines 2-6). We obtain the full set of heuristics and the set of local heuristics (lines 7, 8) and set the number of "in use" heuristics to 4 and the number of "in use" local heuristics to at most 4 (lines 9, 10).
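The selection of the "in use" subsets can be sketched as follows. This is an illustrative Python sketch, not the authors' implementation: the function and variable names (reset_heuristics, heu, loc) merely mirror the paper's notation, and we assume the local heuristics form part of the full heuristic set, so that a permutation can place at least one of them among the first N positions.

```python
import random

def reset_heuristics(heu, loc, N=4):
    """Pick the 'in use' subsets: N heuristics from the full set (with at
    least one local heuristic among them) and up to N local heuristics."""
    M = min(N, len(loc))
    # Permute the full set until at least one local heuristic
    # lands among the first N positions, then take the first N.
    in_use = random.sample(heu, len(heu))[:N]
    while not any(h in loc for h in in_use):
        in_use = random.sample(heu, len(heu))[:N]
    # Take the first M local heuristics of a permutation of loc.
    in_use_loc = random.sample(loc, len(loc))[:M]
    # Every "in use" heuristic starts with the same selection probability.
    rates = [1.0 / N] * N
    loc_rates = [1.0 / M] * M
    return in_use, rates, in_use_loc, loc_rates
```

Keeping the subsets small (N = 4) keeps the individual selection probabilities meaningful, as the paper argues; with a large subset the uniform initial rate 1/N would already be close to zero.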
We decided to keep just 4 heuristics (and at most 4 local heuristics) since the probability of being selected would not be useful if a large number of heuristics were in use (the probability tends to zero as the number of heuristics increases). The mechanism for selecting and changing the "in use" heuristics and local heuristics (line 11) is shown in Algorithm 3; it is applied in the replacement strategy when required, see Subsection 1.3 for more details. Finally, the depth of search rate parameter is set to 0.1, meaning that at first more exploration (mutation) than local search is performed.

In the heuristic reset process (Algorithm 3), the "in use" heuristics are selected by performing a random permutation of the full set of heuristics that places at least one local heuristic among the first N positions, and taking the first N heuristics of the permuted set (line 1). A similar process is performed for the local heuristics, selecting the first M local heuristics as the "in use" ones (line 2). Two heuristic rate vectors are associated with the "in use" heuristic and local heuristic sets, every "in use" heuristic having the same probability of being selected (lines 3, 4).

1.2 Offspring Generation

The offspring generation process is shown in Algorithm 4. In each iteration, one heuristic is selected from the sub-set of "in use" heuristics according to its associated probability¹ (line 1). The same process is performed for selecting a local heuristic² (line 2). Then, the heuristic and the local heuristic that were selected are applied (in that order) to the candidate solution (lines 3-8). Notice that when the heuristic is a crossover heuristic, it is performed between the candidate solution and the best solution (line 4).

¹ See Table 1 for a reference of the set of variables used by HaEaHH.
² Just the heuristics in the sub-set are considered.

Algorithm 1 Hybrid Adaptive Evolutionary Algorithm Hyper-Heuristic (HaEaHH)
HaEaHH( Pr )
1. init(Pr)
2. while( !hasTimeExpired() ) do
3.   setDepthOfSearch(P, α)
4.   setIntensityOfMutation(P, β − α)
5.   offspring(P)
6.   replacement(P, h, l)
7. end

Algorithm 2 Initializing Variable Values
init(P)
1. setMemorySize(P, 3)
2. initialiseSolution(P, 0)
3. initialiseSolution(P, 1)
4. p = indexOfBest(0, 1)
5. c = indexOfWorst(0, 1)
6. copySolution(p, b)
7. heu = getHeuristics(P)
8. loc = getLocalHeuristics(P)
9. N = 4
10. M = min{N, size(loc)}
11. reset_heuristics()
12. α = 0.1

Algorithm 3 Resetting the "in-use" heuristics
reset_heuristics()
1. permutation′(heu)
2. permutation(loc)
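The adaptive core described above — selection over the "in use" rate vectors, the random reward/punish updates, and the depth-of-search schedule of Algorithm 1 — could be sketched as follows. This is a minimal Python illustration under our own assumptions: the helper names (select, reward, punish, next_depth_of_search) are not HyFlex API calls, and we take β = 0.5 as the upper bound that triggers the reset of α.

```python
import random

def select(rates):
    """Roulette-wheel selection: pick index i with probability rates[i]."""
    r, acc = random.random(), 0.0
    for i, p in enumerate(rates):
        acc += p
        if r < acc:
            return i
    return len(rates) - 1

def reward(rates, i, delta=None):
    """Increase rates[i] by a random amount and renormalise."""
    rates[i] += delta if delta is not None else random.uniform(0.0, 0.1)
    s = sum(rates)
    return [p / s for p in rates]

def punish(rates, i, delta=None):
    """Decrease rates[i] by a random amount (kept non-negative) and renormalise."""
    rates[i] = max(0.0, rates[i] - (delta if delta is not None else random.uniform(0.0, 0.1)))
    s = sum(rates)
    return [p / s for p in rates]

def next_depth_of_search(alpha, beta=0.5):
    """Raise alpha by a random step in [0.0, 0.1]; once it reaches beta,
    restart from a value in [0.0, 0.1]. Intensity of mutation is beta - alpha."""
    alpha += random.uniform(0.0, 0.1)
    if alpha >= beta:
        alpha = random.uniform(0.0, 0.1)
    return alpha, beta - alpha
```

In each iteration the hyper-heuristic would pick a heuristic index with select(rates), apply it together with a local heuristic, and then call reward or punish on both rate vectors depending on whether the child improved on the parent; next_depth_of_search realises the α schedule so that intensity of mutation always moves opposite to depth of search.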
Publication date: 2011